

Google tells AI to explain itself • DEVCLASS


Google has added Explainable AI services to its cloud platform, in an effort to make the decision-making processes of machine learning models more transparent to users, and thus build greater trust in the models themselves. Announced on the Google Cloud Blog, the new capability is intended to improve the interpretability of machine learning models. But this is no easy task, as Google admits. Google Cloud AI Explanations takes the approach of quantifying each data factor's contribution to the output of a particular machine learning model, to help the human user understand why the model made the decisions it did. In other words, it is a far cry from an explanation in layman's terms, and will only really make sense to the data scientists or developers who built the model in the first place.
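To illustrate the idea of quantifying each feature's contribution, here is a minimal, hypothetical sketch. Google's service uses techniques such as integrated gradients and sampled Shapley values; for a plain linear model the same idea reduces to an exact calculation, where each feature's attribution is its weight times its deviation from a baseline input. The function and variable names below are illustrative only, not part of any Cloud API.

```python
# Sketch of per-feature attribution for a linear model: each feature's
# contribution is weight * (input - baseline). For linear models these
# contributions sum exactly to the change in prediction versus the baseline.
# All names here are hypothetical, for illustration only.

def attribute_linear(weights, inputs, baseline):
    """Return each feature's contribution relative to a baseline input."""
    return [w * (x - b) for w, x, b in zip(weights, inputs, baseline)]

weights = [0.5, -2.0, 1.0]   # hypothetical trained model weights
inputs = [4.0, 1.0, 3.0]     # the instance being explained
baseline = [0.0, 0.0, 0.0]   # reference point, e.g. an all-zero input

contribs = attribute_linear(weights, inputs, baseline)
print(contribs)  # [2.0, -2.0, 3.0]
```

Here the second feature pushes the prediction down while the first and third push it up; real attribution methods produce the same kind of signed, per-feature breakdown for non-linear models, which is why the output is most meaningful to the people who know what each feature represents.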